Oracle inequalities for computationally budgeted model selection

Authors

  • Alekh Agarwal
  • John C. Duchi
  • Peter L. Bartlett
  • Clément Levrard
Abstract

We analyze general model selection procedures using penalized empirical loss minimization under computational constraints. While classical model selection approaches do not consider computational aspects of performing model selection, we argue that any practical model selection procedure must not only trade off estimation and approximation error, but also the effects of the computational effort required to compute empirical minimizers for different function classes. We provide a framework for analyzing such problems, and we give algorithms for model selection under a computational budget. These algorithms satisfy oracle inequalities that show that the risk of the selected model is not much worse than if we had devoted all of our computational budget to the best function class.
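The abstract describes penalized empirical loss minimization over several function classes when the computational effort available to fit each class is limited. The following Python sketch illustrates that idea in miniature; it is not the authors' algorithm. The polynomial classes, the even budget split, and the penalty constant `c_pen` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_class(x, y, degree, n_budget):
    """Fit a degree-`degree` polynomial using only the first `n_budget` samples,
    standing in for empirical minimization under a per-class computational budget."""
    n = min(n_budget, len(x))
    coefs = np.polyfit(x[:n], y[:n], deg=degree)
    return coefs, n

def budgeted_model_selection(x, y, degrees, total_budget, c_pen=1.0):
    """Split the total budget evenly across classes, fit each within its share,
    and return the class minimizing empirical risk plus a complexity penalty
    that shrinks with the number of samples the class actually processed."""
    per_class = total_budget // len(degrees)
    best = None
    for d in degrees:
        coefs, n_used = fit_class(x, y, d, per_class)
        risk = np.mean((np.polyval(coefs, x[:n_used]) - y[:n_used]) ** 2)
        # Penalty grows with class complexity (d + 1 parameters) and decays
        # with the effective sample size n_used.
        score = risk + c_pen * (d + 1) / max(n_used, 1)
        if best is None or score < best[0]:
            best = (score, d, coefs)
    return best

# Toy data: a noisy cubic; the candidate classes are polynomials of degree 1..6.
x = rng.uniform(-1.0, 1.0, 2000)
y = x ** 3 - x + 0.1 * rng.normal(size=x.size)
score, degree, _ = budgeted_model_selection(x, y, degrees=range(1, 7), total_budget=1200)
print(f"selected degree = {degree}, penalized score = {score:.4f}")
```

In the paper itself the allocation of computation is part of the procedure, and the oracle inequalities compare the selected model's risk to what would be achieved by spending the entire budget on the best single function class; the sketch only fixes an even split for illustration.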


Related articles

Oracle inequalities for computationally adaptive model selection

We analyze general model selection procedures using penalized empirical loss minimization under computational constraints. While classical model selection approaches do not consider computational aspects of performing model selection, we argue that any practical model selection procedure must not only trade off estimation and approximation error, but also the computational effort required to co...


Tail index estimation, concentration and adaptivity

This paper presents an adaptive version of the Hill estimator based on Lepski's model selection method. This simple data-driven index selection method is shown to satisfy an oracle inequality and is checked to achieve the lower bound recently derived by Carpentier and Kim. In order to establish the oracle inequality, we derive non-asymptotic variance bounds and concentration inequalities for Hi...


Parameter-free online learning via model selection

We introduce an efficient algorithmic framework for model selection in online learning, also known as parameter-free online learning. Departing from previous work, which has focused on highly structured function classes such as nested balls in Hilbert space, we propose a generic meta-algorithm framework that achieves online model selection oracle inequalities under minimal structural assumption...


To "General Non-Exact Oracle Inequalities for Classes with a Subexponential Envelope"

We apply Theorem A to the problem of Convex aggregation and show that the optimal rate of Convex aggregation for non-exact oracle inequalities is much faster than the optimal rate for exact oracle inequalities. We apply Theorem B to show that regularized procedures based on a nuclear norm criterion satisfy oracle inequalities with a residual term that decreases like 1/n for every Lq-loss functi...


General Oracle Inequalities for Gibbs Posterior with Application to Ranking

In this paper, we summarize some recent results in Li et al. (2012), which can be used to extend an important PAC-Bayesian approach, namely the Gibbs posterior, to study the nonadditive ranking risk. The methodology is based on assumption-free risk bounds and nonasymptotic oracle inequalities, which leads to nearly optimal convergence rates and optimal model selection to balance the approximati...



Journal:

Volume, issue:

Pages:

Publication year: 2011